Cloud Operations - Logging
Introduction to cloud operations services
Cloud Operations is the suite of services that helps you monitor, log, and debug your applications, and reduce latency in your APIs and Google Cloud services. We have already seen the Monitoring service from the Cloud Operations suite. In this lesson, we will look at the rest of the services.
Logging#
Cloud Logging is the centralized log service for all the resources hosted on Google Cloud. All types of logs, including network and access logs, are captured and stored by the Logging service.
How does this work? What is the architecture that supports this type of centralized logging mechanism? Let’s have a look at Cloud Logging and its components.
The architecture is simple and straightforward. All resource logs are forwarded to the Cloud Logging API. Behind the API, the Log Router determines whether each log entry should be stored, excluded, or exported, based on the filters applied to it.
Because filters are evaluated in parallel, a log entry can be excluded from storage and exported to an event-processing system at the same time. Audit logs and network logs are the most frequently exported types; they are commonly routed to long-term storage classes in Cloud Storage buckets.
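As a sketch of such an export, the command below creates a sink that routes audit log entries to a Cloud Storage bucket. The sink and bucket names are placeholders, not values from this lesson:

```shell
# Hypothetical sink routing audit log entries to a Cloud Storage
# bucket (bucket name is a placeholder) for long-term retention.
gcloud logging sinks create audit-archive \
  storage.googleapis.com/my-audit-log-archive \
  --log-filter='logName:"cloudaudit.googleapis.com"'
```

After the sink is created, its writer service account (shown in the command output) must be granted write access to the bucket before entries start flowing.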
Log viewer#
To access the Logs viewer, open the main menu > Cloud Operations > Logging > Logs viewer.
- The Logs viewer provides the interface to query and view the logs of a specific resource.
- You can use the select menus to filter by resource, or write filter queries directly.
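The same filters you build in the Logs viewer can also be run from the command line. A minimal sketch, assuming the gcloud CLI is installed and authenticated (the resource type and severity here are illustrative):

```shell
# Read the last 5 ERROR-or-higher entries logged by
# Compute Engine instances in the current project.
gcloud logging read \
  'resource.type="gce_instance" AND severity>=ERROR' \
  --limit=5 --format=json
```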
Logs dashboard#
The Logs dashboard gives you an at-a-glance view of all your services. It shows logs grouped by severity level. The most common severity levels are:
- Default
- Error
- Info
- Notice
- Debug
You can also download the charts for presentations or other uses.
Logs-based metrics#
This window shows predefined and user-defined logs-based metrics. These metrics count the log entries that match a filter you define. You can also create alerts based on these metrics.
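A user-defined logs-based metric can be created from the command line as well. A sketch, where the metric name and filter are illustrative assumptions:

```shell
# Hypothetical counter metric that counts ERROR-or-higher entries
# from Compute Engine instances.
gcloud logging metrics create vm-error-count \
  --description="Errors logged by GCE instances" \
  --log-filter='resource.type="gce_instance" AND severity>=ERROR'
```

Once the metric exists, it can be charted in Monitoring or used as the condition of an alerting policy.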
Log router#
The Log Router is used to create log sinks to particular export destinations. As shown in the diagram, you can export logs to different tools through the Log Router.
To create a sink, click the Create Sink button and fill in the form.
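The equivalent can be done with the CLI. A sketch of a sink that routes high-severity entries to BigQuery; the project and dataset names are placeholders:

```shell
# Hypothetical sink routing ERROR-and-above entries to a
# BigQuery dataset for analysis.
gcloud logging sinks create error-sink \
  bigquery.googleapis.com/projects/my-project/datasets/error_logs \
  --log-filter='severity>=ERROR'
```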
Log storage#
This window shows the location and bucket details of the log storage. Log storage is charged at $0.50/GB once you exceed the 50 GB per-project free quota.
Audit logs are stored in a separate bucket and have a maximum retention period of 400 days. For the rest of the logs, the retention period is 30 days, after which they are no longer accessible.
You can create an alert for log volume so that, as you approach the 50 GB mark, you can add exclusions to stop storing irrelevant logs.
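To make the pricing concrete, a quick back-of-the-envelope calculation using the figures above (50 GB free per project, then $0.50/GB); the 120 GB monthly volume is just an example:

```shell
# Estimate the monthly log storage bill for a project.
# Assumed pricing from this lesson: first 50 GB free, then $0.50/GB.
ingested_gb=120
billable=$(( ingested_gb > 50 ? ingested_gb - 50 : 0 ))
cost=$(awk -v gb="$billable" 'BEGIN { printf "%.2f", gb * 0.50 }')
echo "Billable: ${billable} GB, estimated cost: \$${cost}"
```

For 120 GB ingested, 70 GB is billable, so the estimate comes to $35.00 for the month.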
You can find the quotas and limits for Logging in the official documentation.